AIbase

# Mixture-of-Experts Reasoning

## Arcana Qwen3 2.4B A0.6B
License: Apache-2.0
This is a Mixture-of-Experts (MoE) model based on Qwen3, with 2.4 billion total parameters composed of four experts of 0.6 billion parameters each. Because only a subset of experts is active for any given input, it is designed to deliver accurate results with higher efficiency and lower memory usage than a dense model of the same total size.
Tags: Large Language Model · Transformers · Multilingual
Author: suayptalha
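The efficiency claim above comes from MoE routing: a gating network selects one (or a few) of the experts per token, so only that expert's parameters are exercised even though all experts contribute to the total count (here, 4 × 0.6B = 2.4B total, with roughly one expert's worth active). The sketch below illustrates top-1 gating with toy dimensions; the expert count, layer shapes, activation, and gating scheme are illustrative assumptions, not the actual internals of Arcana Qwen3, which are not documented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for illustration only -- not the real model's sizes.
D_MODEL, D_FF, N_EXPERTS = 8, 16, 4

# Each "expert" is a small two-matrix feed-forward block. Only the
# selected expert runs per token, which is why active parameters
# (one expert) are far fewer than total parameters (all experts).
experts = [
    (rng.standard_normal((D_MODEL, D_FF)) * 0.1,
     rng.standard_normal((D_FF, D_MODEL)) * 0.1)
    for _ in range(N_EXPERTS)
]
gate = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.1  # gating weights

def moe_forward(x):
    """Route each token (row of x) to its top-1 expert."""
    logits = x @ gate                                   # (tokens, experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)          # softmax over experts
    choice = probs.argmax(axis=-1)                      # top-1 expert per token
    out = np.empty_like(x)
    for i, e in enumerate(choice):
        w1, w2 = experts[e]
        h = np.maximum(x[i] @ w1, 0.0)                  # ReLU stand-in activation
        out[i] = (h @ w2) * probs[i, e]                 # scale by gate weight
    return out, choice

tokens = rng.standard_normal((5, D_MODEL))
y, routed = moe_forward(tokens)
print(y.shape, routed)
```

Each of the five toy tokens runs through exactly one expert's weights; a real MoE layer typically adds a load-balancing loss so tokens spread across experts rather than collapsing onto one.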